Hierarchical Mixtures of Naive Bayesian Classifiers
Author
Abstract
Naive Bayesian classifiers tend to perform very well on a large number of problem domains, although their representation power is quite limited compared to more sophisticated machine learning algorithms. In this paper we study combining multiple naive Bayesian classifiers by using the hierarchical mixtures of experts system. This novel system, which we call hierarchical mixtures of naive Bayesian classifiers, is compared to a simple naive Bayesian classifier and to using bagging and boosting for combining multiple classifiers. Results on 19 data sets from the UCI repository indicate that the hierarchical mixtures architecture in general outperforms the other methods.
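A minimal sketch of the idea behind the architecture (not the paper's EM-trained hierarchical mixtures of experts): naive Bayesian experts are fitted to regions of the input space and a softmax gate mixes their class posteriors per instance. The scikit-learn classes used below exist as written, but the clustering-based training and the MixtureOfNaiveBayes wrapper are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch: a flat mixture of naive Bayes experts with a softmax gate.
# The real HMNBC trains gates and experts jointly; here experts are assigned by
# k-means regions purely to keep the example short and runnable.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB


class MixtureOfNaiveBayes:
    def __init__(self, n_experts=4, random_state=0):
        self.n_experts = n_experts
        self.random_state = random_state

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Assign each training point to an expert via clustering (a stand-in
        # for the responsibilities that EM would compute).
        km = KMeans(n_clusters=self.n_experts, n_init=10,
                    random_state=self.random_state)
        assignment = km.fit_predict(X)
        # Gate: multinomial logistic regression predicting P(expert | x).
        # Assumes every cluster index occurs, so gate column e is expert e.
        self.gate_ = LogisticRegression(max_iter=1000).fit(X, assignment)
        # Experts: one naive Bayes classifier per region.
        self.experts_ = [GaussianNB().fit(X[assignment == e], y[assignment == e])
                         for e in range(self.n_experts)]
        return self

    def predict_proba(self, X):
        gate_probs = self.gate_.predict_proba(X)              # (n, n_experts)
        mixed = np.zeros((X.shape[0], len(self.classes_)))
        for e, nb in enumerate(self.experts_):
            # Align this expert's class posteriors with the global class set,
            # since an expert may not have seen every class.
            p = np.zeros_like(mixed)
            cols = np.searchsorted(self.classes_, nb.classes_)
            p[:, cols] = nb.predict_proba(X)
            mixed += gate_probs[:, [e]] * p
        return mixed

    def predict(self, X):
        return self.classes_[np.argmax(self.predict_proba(X), axis=1)]
```

In the hierarchical mixtures of experts framework the gates and experts are trained jointly (typically with EM) and gates can themselves be nested; the sketch above flattens that hierarchy to a single gating level for brevity.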
Similar resources
A New Hybrid Framework for Filter based Feature Selection using Information Gain and Symmetric Uncertainty (TECHNICAL NOTE)
Feature selection is a pre-processing technique used to eliminate irrelevant and redundant features, which enhances the performance of classifiers. When a dataset contains many irrelevant and redundant features, classification accuracy fails to improve and classifier performance degrades. To avoid this, this paper presents a new hybrid feature selection method usi...
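As a rough illustration of the filter scores mentioned above, the sketch below computes information gain and symmetric uncertainty for discrete features with NumPy; the function names and the bit-based entropy convention are assumptions for the example, not taken from the cited note.

```python
# Illustrative only: the information-gain and symmetric-uncertainty scores
# that filter-based feature selection methods typically rank features by.
import numpy as np


def entropy(values):
    """Shannon entropy (in bits) of a discrete array."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))


def information_gain(feature, target):
    """IG(target; feature) = H(target) - H(target | feature)."""
    h_cond = 0.0
    for v in np.unique(feature):
        mask = feature == v
        h_cond += mask.mean() * entropy(target[mask])
    return entropy(target) - h_cond


def symmetric_uncertainty(feature, target):
    """SU = 2 * IG / (H(feature) + H(target)), normalised to [0, 1]."""
    denom = entropy(feature) + entropy(target)
    return 0.0 if denom == 0 else 2.0 * information_gain(feature, target) / denom
```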
Hierarchical Classification for Solving Multi-class Problems: A New Approach Using Naive Bayesian Classification
A hierarchical classification ensemble methodology is proposed as a solution to the multi-class classification problem, where the outputs from a collection of classifiers, arranged in a hierarchical manner, are combined to produce a better composite global classification (better than when the classifiers making up the ensemble operate in isolation). A novel topology for arranging the classifiers ...
Learning mixtures of polynomials from data using B-spline interpolation
Hybrid Bayesian networks efficiently encode a joint probability distribution over a set of continuous and discrete variables. Several approaches have been recently proposed for working with hybrid Bayesian networks, e.g., mixtures of truncated basis functions, mixtures of truncated exponentials or mixtures of polynomials (MoPs). We present a method for learning MoP approximations of probability...
Hierarchical Naive Bayes Classifiers for uncertain data
In experimental sciences many classification problems deal with variables with replicated measurements. In this case the replicates are usually summarized by their mean or median. However, such a choice does not consider the information about the uncertainty associated with the measurements, thus potentially leading to over- or underestimation of the probability associated with each classification. In th...
Some Issues in the Automatic Classification of U.S. Patents
The classification of U.S. patents poses some special problems due to the enormous size of the corpus, the size and complex hierarchical structure of the classification system, and the size and structure of patent documents. The representation of the complex structure of documents has not received a great deal of previous attention, but we have found it to be an important factor in our work. We...
Journal:
Volume, Issue:
Pages: -
Publication date: 2007